The entropic vector or entropic function is a concept arising in information theory. Shannon's entropy measures, together with their associated identities and inequalities (both constrained and unconstrained), have received a great deal of attention ever since Shannon introduced the concept of information entropy. Many such identities and inequalities have been found and are collected in standard information theory texts. More recently, researchers have focused on characterizing ''all'' possible identities and inequalities (both constrained and unconstrained) satisfied by such entropies. The entropic vector lays down the basic framework for this study.

==Definition==
Let <math>X_1, \dots, X_n</math> be jointly distributed discrete random variables, with <math>n \geq 1</math>. A vector ''h'' in <math>\mathbb{R}^{2^n - 1}</math> is an entropic vector of order <math>n</math> if and only if there exists a tuple <math>(X_1, \dots, X_n)</math> with associated vector <math>h = \big(H(X_\alpha)\big)_{\emptyset \neq \alpha \subseteq \{1, \dots, n\}}</math>, where <math>X_\alpha = (X_i)_{i \in \alpha}</math> and <math>H</math> denotes joint entropy. The set of all entropic vectors of order <math>n</math> is denoted <math>\Gamma_n^*</math>.

All the properties of entropic functions can be transposed to entropic vectors:
* <math>H</math> is continuous.
* Given a deterministic random variable <math>X</math>, we have <math>H(X) = 0</math>.
* Given <math>\lambda \geq 0</math>, there exists a random variable <math>X</math> such that <math>H(X) = \lambda</math>.
* Given a probability distribution on a finite set <math>\mathcal{X}</math>, we have <math>H(X) \leq \log |\mathcal{X}|</math>.
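As a concrete sketch of the definition, the snippet below computes the entropic vector of a given joint distribution: for each nonempty subset of variable indices it marginalizes the joint probability mass function and takes the Shannon entropy. The function names (`entropy`, `entropic_vector`) and the dict-based pmf representation are illustrative choices, not from the original text.

```python
from itertools import combinations
from collections import defaultdict
import math

def entropy(probs):
    """Shannon entropy (in bits) of a probability mass function."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def entropic_vector(joint, n):
    """Compute the entropic vector of a joint pmf over n variables.

    joint: dict mapping n-tuples of outcomes to probabilities.
    Returns a dict mapping each nonempty subset alpha of {0, ..., n-1}
    (as a frozenset) to the joint entropy H(X_alpha) -- one entry per
    nonempty subset, i.e. 2^n - 1 entries in total.
    """
    h = {}
    for r in range(1, n + 1):
        for alpha in combinations(range(n), r):
            # Marginalize the joint pmf onto the coordinates in alpha.
            marginal = defaultdict(float)
            for outcome, p in joint.items():
                marginal[tuple(outcome[i] for i in alpha)] += p
            h[frozenset(alpha)] = entropy(marginal.values())
    return h

# Example: two independent fair bits X0, X1.
joint = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
h = entropic_vector(joint, 2)
# H(X0) = H(X1) = 1 bit, H(X0, X1) = 2 bits, so h = (1, 1, 2).
```

For two independent fair bits the resulting vector is <math>(1, 1, 2)</math>, which lies in <math>\Gamma_2^*</math>.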